Local Shrunk Discriminant Analysis (LSDA)
Authors
Abstract
Dimensionality reduction is a crucial step in pattern recognition and data mining tasks to overcome the curse of dimensionality. Principal component analysis (PCA) is a traditional technique for unsupervised dimensionality reduction, often employed to find the projection that best represents the data in a least-squares sense, but when the original data has a nonlinear structure, the performance of PCA drops quickly. Linear discriminant analysis (LDA), a supervised dimensionality reduction algorithm, seeks an embedding transformation that works well for Gaussian-distributed or single-modal data, but it gives undesirable results on non-Gaussian or multimodal data. Worse still, the dimensionality of the subspace LDA can extract is bounded by the number of classes (at most one less than the number of classes). To address these issues, Local Shrunk Discriminant Analysis (LSDA) is proposed in this work for non-Gaussian or multimodal data; it not only incorporates both the linear and nonlinear structure of the original data, but also learns a pattern shrinking that lets the data fit the manifold structure more flexibly. Furthermore, LSDA has stronger generalization ability: its objective function reduces to local LDA and to traditional LDA at the respective extreme settings of its parameter. In addition, a new efficient optimization algorithm is introduced to solve the non-convex objective function at low computational cost. Compared with related approaches such as PCA, LDA and local LDA, the proposed method derives a subspace that is better suited to non-Gaussian and real-world data. Promising experimental results on several kinds of data sets demonstrate the effectiveness of the proposed approach.
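For reference (not part of the original abstract), the classical LDA criterion contrasted above maximizes between-class scatter against within-class scatter; since the between-class scatter matrix is built from only the C class means, its rank is at most C-1, which is the source of the dimensionality limit mentioned in the abstract.

```latex
% Classical LDA objective (standard textbook form, shown here only for context)
\max_{W}\ \operatorname{tr}\!\left( (W^{\top} S_w W)^{-1}\, W^{\top} S_b W \right),
\qquad
S_b = \sum_{c=1}^{C} n_c\, (\mu_c - \mu)(\mu_c - \mu)^{\top},
\qquad
S_w = \sum_{c=1}^{C} \sum_{i:\, y_i = c} (x_i - \mu_c)(x_i - \mu_c)^{\top}
```

LSDA's own objective is not reproduced on this page; the formula above is only the baseline it generalizes, and the abstract states that LSDA recovers local LDA and this classical form at the extreme settings of its parameter.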
Similar Resources
An Adaptive Neighborhood Choosing of the Local Sensitive Discriminant Analysis Algorithm
The curse of dimensionality is a problem machine learning algorithms often encounter when studying high-dimensional data, and LSDA (Locality Sensitive Discriminant Analysis) can alleviate it. However, LSDA cannot fully meet the neighborhood requirements of manifold learning; by using an adaptive neighborhood selection method to measure...
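The snippet above is truncated and the paper's actual selection rule is not shown on this page; purely as a hypothetical illustration of what adaptive neighborhood choosing can mean (all names below are invented here), the sketch picks a per-point neighborhood size by cutting the sorted distance list at its largest relative gap instead of using one global k.

```python
import numpy as np

def adaptive_neighborhoods(X, k_min=3, k_max=15):
    """Toy per-point neighborhood selection (illustrative only, not the cited paper's rule).

    For each sample, distances to all other samples are sorted and the
    neighborhood is cut at the largest relative gap between consecutive
    distances, restricted to the range [k_min, k_max].
    """
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared pairwise distances
    neighborhoods = []
    for i in range(n):
        order = np.argsort(d2[i])[1:]                 # nearest first, self excluded
        dist = np.sqrt(d2[i][order])
        gaps = (dist[1:] - dist[:-1]) / (dist[:-1] + 1e-12)
        # largest gap inside the allowed k range determines where to cut
        k_i = int(np.argmax(gaps[k_min - 1:k_max]) + k_min)
        neighborhoods.append(order[:k_i])
    return neighborhoods
```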
Locality Sensitive Discriminant Analysis
Linear Discriminant Analysis (LDA) is a popular data-analytic tool for studying the class relationship between data points. A major disadvantage of LDA is that it fails to discover the local geometrical structure of the data manifold. In this paper, we introduce a novel linear algorithm for discriminant analysis, called Locality Sensitive Discriminant Analysis (LSDA). When there is no sufficien...
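As a rough sketch of the locality-sensitive idea described above (following the commonly described LSDA construction of within-class and between-class neighborhood graphs; the exact weights in the cited paper may differ), each point's k nearest neighbors are split by label before any discriminant objective is formed:

```python
import numpy as np

def lsda_neighborhood_graphs(X, y, k=5):
    """Build the two adjacency matrices used by locality-sensitive discriminant
    criteria: same-label neighbors (W_within) and different-label neighbors
    (W_between). Sketch only; weighting schemes vary between papers."""
    y = np.asarray(y)
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared pairwise distances
    W_within = np.zeros((n, n))
    W_between = np.zeros((n, n))
    for i in range(n):
        nn = np.argsort(d2[i])[1:k + 1]          # k nearest neighbors, self excluded
        W_within[i, nn[y[nn] == y[i]]] = 1.0     # neighbors sharing the label of x_i
        W_between[i, nn[y[nn] != y[i]]] = 1.0    # neighbors carrying other labels
    # symmetrize so neighborhood relations are undirected
    W_within = np.maximum(W_within, W_within.T)
    W_between = np.maximum(W_between, W_between.T)
    return W_within, W_between
```

A projection is then sought that keeps pairs linked in W_within close while pushing pairs linked in W_between apart, which is where the local behaviour missing from plain LDA comes from.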
Exploiting Known Taxonomies in Learning Overlapping Concepts
Locality sensitivity discriminant analysis-based feature ranking of human emotion actions recognition
[Purpose] Computational intelligence tasks such as pattern recognition are frequently confronted with high-dimensional data, so dimensionality reduction is critical to make the manifold features tractable. Procedures that are analytically or computationally manageable with smaller amounts of data and in a low-dimensional space can become important for producing a better classification perfo...
LSDA Solution Schemes for Modelless 3D Head Pose Estimation
Locality Sensitive Discriminant Analysis (LSDA) is a recent linear manifold learning method used in pattern recognition and computer vision. When LSDA is used for face image analysis, it suffers from a number of problems, including the Small Sample Size (SSS) problem and a classification performance that appears to be heavily influenced by its parameters. In this paper, we propose two novel...
Journal: CoRR
Volume: abs/1705.01206
Pages: -
Publication year: 2017